Attractor Networks for Shape Recognition

Authors

  • Yali Amit
  • Massimo Mascaro
Abstract

We describe a system of thousands of binary perceptrons with coarse oriented edges as input that is able to recognize shapes, even in a context with hundreds of classes. The perceptrons have randomized feedforward connections from the input layer and form a recurrent network among themselves. Each class is represented by a prelearned attractor (serving as an associative hook) in the recurrent net corresponding to a randomly selected subpopulation of the perceptrons. In training, first the attractor of the correct class is activated among the perceptrons; then the visual stimulus is presented at the input layer. The feedforward connections are modified using field-dependent Hebbian learning with positive synapses, which we show to be stable with respect to large variations in feature statistics and coding levels and to allow the use of the same threshold on all perceptrons. Recognition is based only on the visual stimuli. These activate the recurrent network, which is then driven by the dynamics to a sustained attractor state, concentrated in the correct class subset and providing a form of working memory. We believe this architecture is more transparent than standard feedforward two-layer networks and has stronger biological analogies.
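The training and recognition loop sketched in the abstract can be made concrete with a small numerical example. The code below is not the authors' implementation: the layer sizes, connection density, threshold, potentiation ceiling, and learning step are illustrative assumptions, and the field-dependent Hebbian rule is reduced to a simple form (potentiate positive feedforward synapses from active inputs onto clamped attractor units while their field stays below a ceiling; depress synapses onto units outside the class population whose field exceeds the firing threshold). Recognition lets the stimulus drive the recurrent net and iterates the binary dynamics until activity settles on one class subpopulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- illustrative sizes and constants (assumed, not taken from the paper) ---
N_INPUT   = 400     # coarsely coded oriented-edge input units
N_PERCEP  = 2000    # binary perceptrons forming the recurrent layer
N_CLASSES = 10
POP_SIZE  = 150     # randomly selected subpopulation per class attractor
THETA     = 5.0     # common firing threshold used on every perceptron
K_HIGH    = 8.0     # field ceiling above which potentiation stops (assumed)
DW        = 0.05    # synaptic step size

# Each class is represented by a random subpopulation of the perceptrons.
class_pop = [rng.choice(N_PERCEP, POP_SIZE, replace=False) for _ in range(N_CLASSES)]

# Randomized, sparse, strictly positive feedforward connectivity.
ff_mask = rng.random((N_PERCEP, N_INPUT)) < 0.1
W_ff = np.zeros((N_PERCEP, N_INPUT))

# Recurrent synapses "prelearned" so that each class subpopulation is an attractor.
W_rec = np.zeros((N_PERCEP, N_PERCEP))
for pop in class_pop:
    W_rec[np.ix_(pop, pop)] += 0.1
np.fill_diagonal(W_rec, 0.0)

def train_step(x, label):
    """One simplified field-dependent Hebbian update of the feedforward synapses.

    The correct class attractor is activated (clamped) before the stimulus is
    presented. Synapses from active inputs onto clamped units are potentiated
    only while the post-synaptic field is below K_HIGH; synapses onto units
    outside the class population whose field exceeds THETA are depressed.
    Synapses are clipped at zero so they stay positive.
    """
    y = np.zeros(N_PERCEP)
    y[class_pop[label]] = 1.0            # activate the attractor of the class
    field = W_ff @ x                     # feedforward field on each perceptron
    potentiate = (y == 1.0) & (field < K_HIGH)
    depress    = (y == 0.0) & (field > THETA)
    W_ff[potentiate] += DW * ff_mask[potentiate] * x
    W_ff[depress]    -= DW * ff_mask[depress] * x
    np.clip(W_ff, 0.0, None, out=W_ff)

def recognize(x, steps=20):
    """Recognition from the visual stimulus alone: the input activates the
    recurrent net, whose dynamics settle into a sustained attractor state
    concentrated in one class subpopulation (a form of working memory)."""
    y = (W_ff @ x > THETA).astype(float)
    for _ in range(steps):
        y = (W_ff @ x + W_rec @ y > THETA).astype(float)
    overlaps = [y[pop].mean() for pop in class_pop]
    return int(np.argmax(overlaps))
```

Reading out the class as the subpopulation with the highest sustained activity mirrors the role of the attractor as an associative hook; the specific constants and the clipped potentiation/depression rule above merely stand in for the field-dependent rule analyzed in the paper.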


Similar Resources

Persian Phone Recognition Using Acoustic Landmarks and Neural Network-Based Variability Compensation Methods

Speech recognition is a subfield of artificial intelligence that develops technologies to convert speech utterances into transcriptions. So far, various methods such as hidden Markov models and artificial neural networks have been used to develop speech recognition systems. In most of these systems, the speech signal frames are processed uniformly, while the information is not evenly distributed ...


Improving Robust Pattern Recognition in Attractor Recurrent Neural Networks by Employing Chaotic Dynamics

In this paper, two kinds of chaotic neural networks are proposed to evaluate the efficiency of chaotic dynamics in robust pattern recognition. The first model is designed based on natural selection theory; in this model, the attractor recurrent neural network intelligently guides the evaluation of the chaotic nodes in order to obtain the best solution. In the second model, a different structure of ch...


Attractor Density Models with Application to Analyzing the Stability of Biological Neural Networks

An attractor modeling algorithm is introduced which draws upon techniques found in nonlinear dynamics and pattern recognition. The technique is motivated by the need for quantitative measures that are able to assess the stability of biological neural networks which utilize nonlinear dynamics to process information. ...


On the Use of Textural Features and Neural Networks for Leaf Recognition

for recognizing various types of plants, so automatic image recognition algorithms can extract these features and apply them to classify plant species. Fast and accurate recognition of plants can have a significant impact on biodiversity management and increase the effectiveness of studies in this regard. These automatic methods have involved the development of recognition techniques and digi...


Distance-based Kernels for Dynamical Movement Primitives

In the Anchoring Problem, actions and objects must be anchored to symbols, and movement primitives such as DMPs seem a good option for describing actions. In the bottom-up approach to anchoring, the recognition of an action is done by applying learning techniques such as clustering. Although most work on movement recognition with DMPs focuses on weights, we propose to use the shape-attractor function ...



Journal:
  • Neural Computation

Volume 13, Issue 6

Pages: -

Publication year: 2001